
    Data management for moulded ceramics and digital image comparison: a case study of Roman terra cotta figurines

    Using a technique of image comparison on the one hand and data management on the other, we have developed a system that takes into account the singularity of moulded ceramics and allows us to approach the figurines from various points of view, such as iconography, archaeological context, fabric and painting. The system is receptive to products of different economic regions and guarantees the objectivity of the dating process. A total of 3200 figurines have been described, measured and photographed to allow comparison of the digital images. The analysis enabled us to establish a relative chronology, to recognise iconographical evolution and to redefine the relationships between the coroplasts and the workshops. The major result of the study is the establishment of a chronological system that will make the figurines an important dating instrument for Roman provincial archaeology. Furthermore, we succeeded in defining the economic interaction between Eastern and Central Gaul.
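    The digital image comparison step can be sketched with a simple perceptual-hash approach: each photograph is reduced to a small grayscale fingerprint, so that figurines from the same mould (or a derived mould generation) cluster at low Hamming distance. This is only an illustrative stand-in for the project's actual system, using the Pillow library; all names are hypothetical.

```python
from PIL import Image

def fingerprint(path, size=16):
    """Reduce a figurine photograph to a binary fingerprint:
    downscale to size x size grayscale, threshold at the mean."""
    img = Image.open(path).convert('L').resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return [p > mean for p in pixels]

def distance(fp_a, fp_b):
    """Hamming distance between fingerprints; low values suggest
    products of the same mould or a derived second-generation mould."""
    return sum(a != b for a, b in zip(fp_a, fp_b))

# Usage sketch: rank the corpus against one query figurine.
# corpus = {name: fingerprint(p) for name, p in photo_paths.items()}
# ranked = sorted(corpus, key=lambda n: distance(query_fp, corpus[n]))
```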

    Acceleration of GATE Monte Carlo simulations

    Positron Emission Tomography (PET) and Single Photon Emission Computed Tomography (SPECT) are forms of medical imaging that produce functional images reflecting biological processes. They are based on the tracer principle: a biologically active substance, a pharmaceutical, is selected so that its spatial and temporal distribution in the body reflects a certain body function or metabolism. In order to form images of the distribution, the pharmaceutical is labeled with gamma-ray-emitting or positron-emitting radionuclides (radiopharmaceuticals or tracers). After administration of the tracer to a patient, an external position-sensitive gamma-ray camera can detect the emitted radiation and, after a reconstruction process, form a stack of images of the radionuclide distribution.

    Monte Carlo methods are numerical methods that use random numbers to compute quantities of interest. This is normally done by creating a random variable whose expected value is the desired quantity; one then simulates and tabulates the random variable and uses its sample mean and variance to construct probabilistic estimates. It represents an attempt to model nature through direct simulation of the essential dynamics of the system in question. Monte Carlo modeling is the method of choice for all applications where measurements are not feasible or where analytic models are not available due to the complex nature of the problem. Such modeling is also a practical approach in several important application fields of nuclear medical imaging: detector design, quantification, correction methods for image degradations, detection tasks, etc.

    Several powerful dedicated Monte Carlo simulators for PET and/or SPECT are available. However, they are often neither detailed nor flexible enough to enable realistic simulations of emission tomography detector geometries while also modeling time-dependent processes such as decay, tracer kinetics, patient and bed motion, dead time or detector orbits. Our Monte Carlo simulator of choice, GEANT4 Application for Tomographic Emission (GATE), was specifically designed to address all these issues. The flexibility of GATE comes at a price, however: the simulation of a simple prototype SPECT detector may be feasible within hours in GATE, but an acquisition with a realistic phantom may take years to complete on a single CPU. In this dissertation we therefore focus on the Achilles' heel of GATE: efficiency. Acceleration of GATE simulations can only be achieved through a combination of efficient data analysis, dedicated variance reduction techniques, fast navigation algorithms and parallelization.

    In the first part of this dissertation we consider the improvement of the analysis capabilities of GATE. The static analysis module in GATE is both inflexible and incapable of storing more detail without introducing a large computational overhead. However, the design and validation of the acceleration techniques in this dissertation require a flexible, detailed and computationally efficient analysis module. To this end, we develop a new analysis framework capable of analyzing any process, from the decay of isotopes to particle interactions and detections, in any detector element and for any type of phantom. The evaluation of our framework consists of the assessment of spurious activity in 124I-Bexxar PET and of contamination in 131I-Bexxar SPECT. In the case of PET we describe how our framework can detect spurious coincidences generated by non-pure isotopes, even with realistic phantoms.
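    The Monte Carlo principle described above can be made concrete with a minimal sketch: the quantity of interest is written as the expectation of a random variable, and the sample mean and variance of repeated draws give the estimate and its error. This is a generic Python illustration, not GATE code.

```python
import math
import random

def monte_carlo_estimate(draw, n):
    """Estimate E[X] by the sample mean of n draws of a random
    variable X constructed so that E[X] is the desired quantity."""
    xs = [draw() for _ in range(n)]
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)   # sample variance
    return mean, math.sqrt(var / n)                    # estimate, std. error

# Example: the probability that a uniform random point in the unit
# square falls inside the quarter disc is pi/4.
inside = lambda: 1.0 if random.random() ** 2 + random.random() ** 2 <= 1.0 else 0.0
mean, err = monte_carlo_estimate(inside, 100_000)
print(f"pi ~ {4 * mean:.4f} +/- {4 * err:.4f}")
```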
    We show that optimized energy thresholds, which can readily be applied in the clinic, can now be derived in order to minimize the contamination. We also show that the spurious activity itself is not spatially uniform, so standard reconstruction and correction techniques are not adequate. In the case of SPECT we describe how it is now possible to classify detections into geometric detections, phantom scatter, penetration through the collimator, collimator scatter and backscatter in the end parts. We show that standard correction algorithms such as triple energy window correction cannot correct for septal penetration. We demonstrate that 124I PET with optimized energy thresholds offers better image quality than 131I SPECT when using standard reconstruction techniques.

    In the second part of this dissertation we focus on improving the efficiency of GATE with a variance reduction technique called Geometrical Importance Sampling (GIS). We describe how only 0.02% of all emitted photons can reach the crystal surface of a SPECT detector head with a low-energy high-resolution collimator; a lot of computing power is therefore wasted on tracking photons that will not contribute to the result. A twofold strategy is used to solve this problem: GIS employs Russian Roulette to discard photons that are unlikely to contribute to the result, while photons in more important regions are split into several photons with reduced weight to increase their survival chance. We show that this technique introduces branches into the particle history and describe how this can be taken into account by a particle history tree that is used for the analysis of the results. The evaluation of GIS consists of energy spectra validation, spatial resolution and sensitivity for low- and medium-energy isotopes. We show that GIS reaches acceleration factors between 5 and 13 over analog GATE simulations for the isotopes in the study. It is a general acceleration technique that can be used for any isotope, phantom and detector combination. Although GIS is useful as a safe and accurate acceleration technique, it cannot deliver clinically acceptable simulation times; the main reason lies in its inability to force photons in a specific direction.

    In the third part of this dissertation we solve this problem for 99mTc SPECT simulations. Our approach is twofold. Firstly, we introduce two variance reduction techniques: forced detection (FD) and convolution-based forced detection (CFD) with multiple projection sampling (MPS). FD and CFD force copies of photons, at decay and at every interaction point, to be transported through the phantom in a direction sampled within a solid angle toward the SPECT detector head, at all SPECT angles simultaneously. We describe how a weight must be assigned to each photon in order to compensate for the forced direction and for non-absorption at emission and scatter, and we show how the weights are calculated from the total and differential Compton and Rayleigh cross sections per electron, with incorporation of Hubbell's atomic form factor. In the case of FD all detector interactions are modeled by Monte Carlo, while in the case of CFD the detector is modeled analytically. Secondly, we describe the design of a navigator specialized for FD and CFD to accelerate the slow tracking algorithms in GEANT4. The validation study shows that both FD and CFD closely match the analog GATE simulations and that we can obtain acceleration factors between 3 (FD) and 6 (CFD) orders of magnitude over analog simulations.
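    The weight bookkeeping behind Russian Roulette and splitting can be sketched as follows. This is a generic, weight-preserving importance-sampling fragment with hypothetical names, not GATE's actual implementation: killed photons transfer their weight to survivors and split photons share it, so the expected total weight, and hence the estimate, is unchanged.

```python
import random

def cross_boundary(photons, importance_ratio):
    """Apply Russian Roulette (ratio < 1) or splitting (ratio >= 1)
    when photons enter a region whose importance differs by
    `importance_ratio`; each photon is a dict with a 'weight'."""
    out = []
    for p in photons:
        if importance_ratio < 1.0:
            # Russian Roulette: survive with probability `ratio`;
            # survivors carry proportionally more weight, keeping the
            # expected total weight unchanged (unbiased estimator).
            if random.random() < importance_ratio:
                out.append({**p, 'weight': p['weight'] / importance_ratio})
        else:
            # Splitting: n copies with reduced weight raise the survival
            # chance in important regions; this is what introduces
            # branches into the particle history.
            n = int(importance_ratio)
            out.extend({**p, 'weight': p['weight'] / n} for _ in range(n))
    return out

photons = [{'weight': 1.0} for _ in range(10_000)]
print(sum(p['weight'] for p in cross_boundary(photons, 0.5)))  # ~10000 on average
```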
    This allows for the simulation of a realistic acquisition with a torso phantom within 130 seconds.

    In the fourth part of this dissertation we exploit the intrinsically parallel nature of Monte Carlo simulations. We show how Monte Carlo simulations should scale linearly as a function of the number of processing nodes, but that this is usually not achieved due to job setup time, output handling and cluster overhead. Our approach is based on two steps: job distribution and output data handling. The job distribution is based on a time-domain partitioning scheme that retains all experimental parameters and guarantees the statistical independence of each subsimulation. We also reduce the job setup time by introducing a parameterized collimator model for SPECT simulations, and we reduce the output data handling time with a chain-based output merger. The scalability study is based on a set of simulations on a 70-CPU cluster and shows an acceleration factor of approximately 66 on 70 CPUs for both PET and SPECT. We also show that our method of parallelization does not introduce any approximations and that it can readily be combined with any of the acceleration techniques described above.
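    A hedged sketch of the time-domain partitioning idea: the acquisition time window is split into slices, each subsimulation keeps identical experimental parameters but gets its own independent random seed, and the outputs are merged afterwards. Function names and the fake result are illustrative, not the actual GATE cluster tools.

```python
from multiprocessing import Pool
import random

def run_subsimulation(job):
    """Stand-in for one GATE job: simulate the time slice [t0, t1)
    of the acquisition with its own independent random stream."""
    t0, t1, seed = job
    rng = random.Random(seed)          # statistical independence per job
    # ... a real job would run the full simulation for this slice;
    # here we just fake a detected-count result.
    return rng.randint(900, 1100)

def parallel_acquisition(t_total, n_jobs):
    # Time-domain partitioning: every job keeps the same geometry,
    # physics and phantom; only the time window and seed differ.
    dt = t_total / n_jobs
    jobs = [(i * dt, (i + 1) * dt, 1000 + i) for i in range(n_jobs)]
    with Pool(n_jobs) as pool:
        outputs = pool.map(run_subsimulation, jobs)
    return sum(outputs)                # output merging (here a trivial sum)

if __name__ == '__main__':
    print(parallel_acquisition(t_total=600.0, n_jobs=4))
```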

    Characterizing the parallax error in multi-pinhole micro-SPECT reconstruction

    The use of pinholes is very important in preclinical micro-SPECT. Pinholes can magnify the object onto the detector, resulting in a system resolution better than the detector resolution. The loss in sensitivity is usually countered by adding more pinholes, each projecting onto a specific part of the detector. As a result, gamma rays have an oblique incidence on the detector. This causes displacement and increased uncertainty in the position of interaction of the gamma ray in the detector, also known as parallax or depth-of-interaction (DOI) error. This in turn has a large influence on image reconstruction algorithms that use ray tracers as a forward projector model, as the end point of each ray on the detector has to be accurately known. In this work, we used GATE to simulate the FLEX Triumph-I system (Gamma Medica-Ideas, Northridge, CA), a CZT-based multi-pinhole micro-SPECT system that uses 5 mm thick CZT pixels with a 1.5 mm pixel pitch. The simulated information was then used to enhance the image resolution by accurately modeling the DOI. Two hundred point sources were simulated and rebinned to use the DOI information. These data were then used in a GPU-based iterative reconstruction algorithm taking the simulated DOI into account. The average displacement was determined for all point sources, and the FWHM was calculated in three dimensions by fitting the point sources with 3D Gaussians. We show that the displacement is reduced by 83% on average, and that a 15% resolution gain is obtained when only 5 DOI levels are used.
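    The geometry behind the parallax error can be illustrated with a small calculation: for a ray entering the detector at angle θ from the normal and interacting at depth d inside the crystal, the apparent position shifts by d·tan θ relative to the entrance point. This simplified model is a sketch under these stated assumptions, not the simulated detector response.

```python
import math

def parallax_displacement(depth_mm, incidence_deg):
    """Lateral shift of the interaction point for a gamma ray hitting
    the detector at `incidence_deg` from the normal and interacting
    at `depth_mm` inside the crystal: d * tan(theta)."""
    return depth_mm * math.tan(math.radians(incidence_deg))

# For the 5 mm thick CZT described above: a ray at 30 degrees that
# interacts mid-crystal (2.5 mm deep) is displaced by ~1.4 mm, almost
# a full 1.5 mm pixel pitch; hence the need for DOI modeling.
print(f"{parallax_displacement(2.5, 30.0):.2f} mm")
```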

    Region-based motion-compensated iterative reconstruction technique for dynamic computed tomography

    Current state-of-the-art motion-compensated dynamic computed tomography reconstruction techniques estimate the deformation by considering motion models over the entire object volume, although the actual change is often local. In this article, we address this issue by introducing the region-based Motion-compensated Iterative Reconstruction Technique (rMIRT). It aims to accurately reconstruct an object that is locally deformed during the scan, while identifying the deformed regions consistently with the motion models. Moreover, the motion parameters that correspond to the deformation in those areas are also estimated. In order to achieve these goals, we consider a mathematical optimization problem whose objective function depends on the reconstruction, the deformed regions and the motion parameters. The derivatives with respect to all of them are formulated analytically, which allows for efficient reconstruction using gradient-based optimizers. To the best of our knowledge, this is the first iterative reconstruction method in dynamic CT that exploits the analytical derivative with respect to the deformed regions.
    Comment: Accepted at ISBI 202
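    The optimization structure described above can be sketched as a single objective depending jointly on the image, a relaxed region mask and a motion parameter, minimized by gradient descent with analytical derivatives obtained via the chain rule. The toy forward model and all names below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def objective(x, region, theta, A, proj):
    """Toy data-fit term 0.5 * ||A @ warp(x) - proj||^2, where the
    'deformation' adds theta inside the (relaxed) region mask only."""
    r = A @ (x + theta * region) - proj
    return 0.5 * r @ r

def gradients(x, region, theta, A, proj):
    # Analytical derivatives with respect to the image, the region
    # weights and the motion parameter (chain rule on the warp).
    r = A @ (x + theta * region) - proj
    g = A.T @ r
    return g, theta * g, region @ g

rng = np.random.default_rng(0)
A = rng.normal(size=(40, 20))                     # toy projector
x_true = rng.normal(size=20)
region_true = (np.arange(20) < 5).astype(float)   # deformation is local
proj = A @ (x_true + 0.8 * region_true)

# Joint gradient descent over (image, region, motion parameter).
x, region, theta = np.zeros(20), np.full(20, 0.5), 0.0
for _ in range(2000):
    gx, gr, gt = gradients(x, region, theta, A, proj)
    x -= 1e-3 * gx
    region = np.clip(region - 1e-3 * gr, 0.0, 1.0)
    theta -= 1e-3 * gt
print(f"objective: {objective(x, region, theta, A, proj):.3e}")
```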

    Deurne - Eksterlaar, subreport 2. Archaeological evaluation: trial trench investigation along the Kerkhofweg

    This report was submitted to the agency together with a number of separate digital annexes. Some of these annexes are not included in this pdf document and are not available online. Certain annexes (site plans, photos, feature descriptions, etc.) may be of importance for a better reading and interpretation of this report. If you wish to consult these annexes, please contact: [email protected]

    Antwerpen Noordersingel 3: Remains of the caponnière on front 4-5 of the Brialmont enclosure. Final report of a chance find.

    During the expansion of a sludge-treatment installation of the RWZI (wastewater treatment plant) at Noordersingel 1-3 in Antwerp, workers struck the wall remains of the 19th-century Grote Omwalling (Great Enclosure), better known as the Brialmont enclosure. Following this chance find, an archaeological investigation was carried out to map the remains and to study the structural characteristics of the construction.

    Lunet F - Fort Burcht IV, Antwerpen Beatrijslaan: archaeological and architectural-historical investigation

    This report was submitted to the agency together with a number of separate digital annexes. Some of these annexes are not included in this pdf document and are not available online. Certain annexes (site plans, photos, feature descriptions, etc.) may be of importance for a better reading and interpretation of this report. If you wish to consult these annexes, please contact: [email protected]